Supplementary Material: Sequential Inference for Deep Gaussian Process
Authors
Abstract
In this section we briefly review sparse online GPs (GPso) [1, 2]. The key idea is to learn GPs recursively, updating the posterior mean and covariance over the training set $\{(x_n, y_n)\}_{n=1}^{N}$ in a sequential fashion. This online procedure is coupled with a sparsification mechanism in which a fixed-size subset of the training set (called the active set) is iteratively selected, so that the computational cost of the updates does not grow without bound.
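To make the recursion concrete, the following is a minimal Python sketch of the idea rather than the exact rank-one updates of [1, 2]: a fixed-size active set is grown only when a new point is sufficiently novel, and the posterior mean and covariance are computed from the active set alone. The RBF kernel, the novelty threshold `tol`, and the class name `SparseOnlineGP` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-wise input sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class SparseOnlineGP:
    """Toy sparse online GP: keeps a fixed-size active set and fits on it."""

    def __init__(self, max_active=50, noise=1e-2, tol=1e-3):
        self.max_active = max_active   # active-set budget
        self.noise = noise             # observation noise variance
        self.tol = tol                 # novelty threshold
        self.X, self.y = None, None    # active-set inputs / targets

    def update(self, x_new, y_new):
        x_new = np.atleast_2d(x_new)
        if self.X is None:
            self.X, self.y = x_new, np.atleast_1d(y_new)
            return
        # Novelty of x_new: residual of its kernel projection onto the
        # active set, gamma = k(x,x) - k_*^T K^{-1} k_*.
        K = rbf_kernel(self.X, self.X) + 1e-8 * np.eye(len(self.X))
        k = rbf_kernel(self.X, x_new)
        gamma = float(rbf_kernel(x_new, x_new) - k.T @ np.linalg.solve(K, k))
        if gamma > self.tol and len(self.X) < self.max_active:
            # Informative point and budget left: grow the active set.
            self.X = np.vstack([self.X, x_new])
            self.y = np.append(self.y, y_new)
        # Otherwise the point is already well represented and is discarded;
        # the full recursion of [1, 2] would instead project its
        # contribution onto the existing active set.

    def predict(self, X_star):
        K = rbf_kernel(self.X, self.X) + self.noise * np.eye(len(self.X))
        k_star = rbf_kernel(self.X, X_star)
        alpha = np.linalg.solve(K, self.y)
        mean = k_star.T @ alpha
        cov = rbf_kernel(X_star, X_star) - k_star.T @ np.linalg.solve(K, k_star)
        return mean, cov
```

Streaming points one at a time via `update(x_t, y_t)` keeps memory bounded by `max_active`, while `predict` uses only the retained points.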
Similar resources
Sequential Inference for Deep Gaussian Process
A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed seq...
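As a rough illustration of the layered structure described above, the snippet below draws one function from a DGP prior by sampling each layer's GP at the previous layer's outputs. The RBF kernel, the number of layers, and the function name `sample_dgp_prior` are assumptions for the sketch; this is not the sequential inference scheme proposed in the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def sample_dgp_prior(X, n_layers=3, jitter=1e-6, seed=0):
    """Draw one function from a DGP prior: feed each layer's GP sample
    into the kernel of the next layer."""
    rng = np.random.default_rng(seed)
    H = X
    for _ in range(n_layers):
        K = rbf_kernel(H, H) + jitter * np.eye(len(H))
        H = rng.multivariate_normal(np.zeros(len(H)), K)[:, None]
    return H.ravel()
```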
Gaussian Process Regression Networks Supplementary Material
In this supplementary material, we discuss some further details of our ESS and VB inference (Sections 1 and 2), the computational complexity of our inference procedures (Section 3), and the correlation structure induced by the GPRN model (Section 4). We also discuss multimodality in the GPRN posterior (Section 5), SVLMC, and some background information and notation for Gaussian process regressi...
High Resolution Shape Completion Using Deep Neural Networks for Global Structure and Local Geometry Inference (Supplementary Material)
Supplementary Material: Memoized Online Variational Inference for Dirichlet Process Mixture Models
This document contains supplementary mathematics and algorithm descriptions to help readers understand our new learning algorithm. First, in Sec. 1 we offer detailed model description and update equations for a DP-GMM with zero-mean, full-covariance Gaussian likelihood. Second, in Sec. 2 we provide step-by-step discussion of our birth move algorithm, providing a level-of-detail at which the int...
Recurrent Gaussian Processes
We define Recurrent Gaussian Processes (RGP) models, a general family of Bayesian nonparametric models with recurrent GP priors which are able to learn dynamical patterns from sequential data. Similar to Recurrent Neural Networks (RNNs), RGPs can have different formulations for their internal states, distinct inference methods and be extended with deep structures. In such context, we propose a ...
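A hypothetical sketch of the recurrent-state idea follows, assuming a one-dimensional state h_t = f(h_{t-1}, x_t) with f drawn from a GP prior and sampled sequentially by conditioning on past draws. The kernel, the noise-free transition, and the function name `sample_rgp_trajectory` are illustrative assumptions, not the RGP formulation or inference method of the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def sample_rgp_trajectory(x_seq, h0=0.0, jitter=1e-6, seed=0):
    """Roll a 1-D recurrent GP state h_t = f(h_{t-1}, x_t) forward in time,
    sampling f sequentially from its GP prior conditioned on past draws."""
    rng = np.random.default_rng(seed)
    Z, F = [], []              # past transition inputs and sampled f-values
    h, states = h0, []
    for x_t in x_seq:
        z = np.array([[h, x_t]])
        if not Z:
            mean, var = 0.0, 1.0
        else:
            Zm, Fv = np.vstack(Z), np.array(F)
            K = rbf_kernel(Zm, Zm) + jitter * np.eye(len(Zm))
            k = rbf_kernel(Zm, z)
            mean = float(k.T @ np.linalg.solve(K, Fv))
            var = float(1.0 - k.T @ np.linalg.solve(K, k)) + jitter
        f = mean + np.sqrt(max(var, 0.0)) * rng.standard_normal()
        Z.append(z[0])
        F.append(f)
        h = f
        states.append(h)
    return np.array(states)
```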
Journal:
Volume, Issue:
Pages: -
Publication date: 2016